Applying Powell's symmetrical technique to conjugate gradient methods

Authors

  • Dongyi Liu
  • Genqi Xu
Abstract

In this paper, a new conjugate gradient method is proposed by applying Powell's symmetrical technique to conjugate gradient methods; it satisfies the sufficient descent property for any line search. Under Wolfe line searches, global convergence of the method is derived from a spectral analysis of the conjugate gradient iteration matrix together with Zoutendijk's condition. On this basis, two concrete descent algorithms are developed. Numerical experiments are presented to verify their performance, and the results show that these algorithms are competitive with the PRP+ algorithm. Finally, a new explanation of the relationship between conjugate gradient methods and quasi-Newton methods is discussed.
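Since the abstract benchmarks against the PRP+ algorithm, a minimal sketch of a PRP+ nonlinear conjugate gradient loop may help fix ideas. This is an illustration only, not the authors' method: it uses a simple backtracking Armijo line search (the Wolfe curvature condition is omitted) and does not include Powell's symmetrical technique.

```python
# Illustrative PRP+ nonlinear conjugate gradient sketch (not the paper's
# exact method). Uses a backtracking Armijo line search for simplicity.
import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:          # safeguard: restart with steepest descent
            d = -g
        alpha, c1 = 1.0, 1e-4      # backtracking Armijo line search
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ beta: the max(0, .) truncation triggers automatic restarts
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10 (y - 2)^2
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)])
x_star = prp_plus_cg(f, grad, [0.0, 0.0])
```

The truncation `max(0, .)` in the beta formula is what distinguishes PRP+ from plain PRP: whenever the PRP beta would go negative, the direction resets toward steepest descent.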


Related articles

A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family not only includes the six existing practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell’s restart criterion, the three-parameter family of...


Crack Detection In Functionally Graded Beams Using Conjugate Gradient Method

In this paper the conjugate gradient (CG) method is employed to identify the parameters of a crack in a functionally graded beam from natural frequency measurements. The crack is modeled as a massless rotational spring with sectional flexibility. By applying Euler-Bernoulli beam theory to the two separate beam segments and enforcing the compatibility requirements at the crack, the characteris...


A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization

In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems, which possesses the sufficient descent property under the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...


Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns

The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feedforward neural networks. The analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs, designed in MATLAB and based on portions (1...




Journal:
  • Comp. Opt. and Appl.

Volume 49, Issue:

Pages: -

Publication year: 2011